# Low perplexity
| Model | Author | License | Tags | Downloads | Likes | Description |
|---|---|---|---|---|---|---|
| Chinese Llama 2 7b Gguf | hfl | Apache-2.0 | Large Language Model; Transformers; Multilingual | 254 | 5 | GGUF-v3 format file of the Chinese-LLaMA-2-7B model, converted for use with llama.cpp |
| Mgpt 1.3B Uzbek | ai-forever | MIT | Large Language Model; Transformers; Multilingual | 118 | 10 | A 1.3B-parameter language model for Uzbek, fine-tuned from mGPT-XL (1.3B) |
| Mgpt 1.3B Mongol | ai-forever | MIT | Large Language Model; Transformers; Multilingual | 1,722 | 2 | A 1.3B-parameter language model for Mongolian, supporting Mongolian natural language processing tasks |
| Mgpt 13B | ai-forever | MIT | Large Language Model; Transformers; Multilingual | 4,742 | 49 | A multilingual model supporting 61 languages across 25 language families, trained on 600 GB of text |
| Gpt2 Base Thai | flax-community | MIT | Large Language Model; Other | 1,026 | 10 | A Thai causal language model based on the GPT-2 architecture, trained on the OSCAR dataset |
| Spanish Gpt2 | mrm8488 | MIT | Large Language Model; Spanish | 971 | 19 | A Spanish GPT-2 model trained from scratch on the large_spanish_corpus (BETO corpus) with Flax, developed during the HuggingFace community week event |
| Gpt2 124M Uk Fiction | Tereveni-AI | | Large Language Model; Other | 60 | 3 | A 124M-parameter GPT-2 model trained on Ukrainian novels, specialized for Ukrainian text generation |
| Ancient Greek BERT | pranaydeeps | | Large Language Model; Transformers | 214 | 14 | The first subword BERT model for Ancient Greek, achieving state-of-the-art fine-tuned performance on part-of-speech tagging and morphological analysis |
| Gpt2 Large Dutch | yhavinga | | Large Language Model; Other | 428 | 7 | A GPT-2 large model (762M parameters) trained from scratch on Dutch, reaching a perplexity of 15.1 on the cleaned Dutch mC4 dataset |
| Reddit Bert Text2 | flboehm | Apache-2.0 | Large Language Model; Transformers | 22 | 0 | A text model fine-tuned from bert-base-uncased on an unspecified dataset, with a validation loss of 2.4969 and a perplexity of 12.14 |
| Gpt2 Small Turkish | gorkemgoknar | Apache-2.0 | Large Language Model; Other | 545 | 10 | A fine-tuned version of the GPT2-Small English model trained on Turkish Wikipedia articles, suited to Turkish text generation |
| Gpt2 Bengali | flax-community | MIT | Large Language Model; Other | 462 | 6 | A Bengali GPT-2 model trained on the mC4 dataset for text generation |
| Gpt2 Medium Finnish | Finnish-NLP | Apache-2.0 | Large Language Model; Other | 30 | 3 | A 345M-parameter GPT-2 model pre-trained on a large Finnish corpus, strong at Finnish text generation |
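Several cards quote a perplexity alongside a training metric: the Reddit Bert Text2 card lists a validation loss of 2.4969 and a perplexity of 12.14, and the Gpt2 Large Dutch card reports a perplexity of 15.1. The two figures are linked by the standard relation perplexity = exp(mean cross-entropy loss in nats); a minimal sketch (the function name is illustrative, not from any of the listed models):

```python
import math

def perplexity(cross_entropy_loss: float) -> float:
    """Perplexity is the exponential of the mean cross-entropy loss (in nats)."""
    return math.exp(cross_entropy_loss)

# Validation loss reported on the Reddit Bert Text2 card
loss = 2.4969
print(round(perplexity(loss), 2))  # → 12.14, matching the card's reported perplexity
```

Lower perplexity means the model assigns higher average probability to the held-out text, which is why it is the headline comparison metric in this listing.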